Open WebUI

Open WebUI is an extensible, feature-rich, and user-friendly self-hosted AI platform designed to operate entirely offline. It supports Ollama and OpenAI-compatible APIs, making it a powerful, provider-agnostic solution for both local and cloud-based models.

Open WebUI Demo


Quick Start

```bash
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main
```

Then open http://localhost:3000.

For GPU support, Docker Compose, and more → Full Docker guide
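If you prefer Docker Compose, the one-line command above can be sketched as a compose file. This is a minimal, unofficial equivalent of the same flags (port mapping, named volume, host gateway alias, restart policy); see the full Docker guide for the project's recommended setup and GPU options.

```yaml
# Minimal sketch mirroring the `docker run` Quick Start command above.
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"            # host port 3000 -> container port 8080
    volumes:
      - open-webui:/app/backend/data   # persist chats, settings, and users
    extra_hosts:
      - "host.docker.internal:host-gateway"  # lets the container reach services on the host (e.g. Ollama)
    restart: always

volumes:
  open-webui:
```

Start it with `docker compose up -d` from the directory containing the file.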

Running? Read this next.

Installed Open WebUI but not sure where to start? The Essentials for Open WebUI guide covers the six things every new user needs to know: plugins, tool calling, task models, context management, RAG, and Open Terminal.


Getting Started


Explore


Enterprise

Need custom branding, SLA support, or Long-Term Support (LTS) versions? → Learn about Enterprise plans


Get Involved


Sponsors


Acknowledgements

This content is for informational purposes only and does not constitute a warranty, guarantee, or contractual commitment. Open WebUI is provided "as is." See your license for applicable terms.